Search Results: "dod"

15 June 2015

Lunar: Reproducible builds: week 7 in Stretch cycle

What happened in the reproducible builds effort this week:

Presentations

On June 7th, Reiner Herrmann presented the project at the Gulaschprogrammiernacht 15 in Karlsruhe, Germany. Video and audio recordings in German are available, and so are the slides in English.

Toolchain fixes

Daniel Kahn Gillmor's report on help2man started a discussion with Brendan O'Dea and Ximin Luo about standardizing a common environment variable that would provide a replacement for an embedded build date. After various proposals and research by Ximin about date handling in several programming languages, the best solution seems to be defining SOURCE_DATE_EPOCH with a value suitable for gmtime(3); a rough sketch of the idea follows below.
  1. Martin Borgert wondered if Sphinx could be changed in a way that would avoid having to tweak debian/rules in packages using it to produce HTML documentation.
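As a rough illustration of the idea (not the eventual specification), a build script that embeds a date could prefer such a variable over the wall clock; apart from the variable name proposed above, everything in this sketch is hypothetical:
# Honour SOURCE_DATE_EPOCH (seconds since the Unix epoch) when it is set,
# and fall back to the current time otherwise. date -u -d @N is GNU date syntax.
BUILD_EPOCH="${SOURCE_DATE_EPOCH:-$(date +%s)}"
BUILD_DATE="$(date -u -d "@${BUILD_EPOCH}" '+%Y-%m-%dT%H:%M:%SZ')"
echo "build date: ${BUILD_DATE}"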
Daniel Kahn Gillmor opened a new report about icont producing unreproducible binaries.

Packages fixed

The following 32 packages became reproducible due to changes in their build dependencies: agda, alex, c2hs, clutter-1.0, colorediffs-extension, cpphs, darcs-monitor, dispmua, haskell-curl, haskell-glfw, haskell-glib, haskell-gluraw, haskell-glut, haskell-gnutls, haskell-gsasl, haskell-hfuse, haskell-hledger-interest, haskell-hslua, haskell-hsqml, haskell-hssyck, haskell-libxml-sax, haskell-openglraw, haskell-readline, haskell-terminfo, haskell-x11, jarjar-maven-plugin, kxml2, libcgi-struct-xs-perl, libobject-id-perl, maven-docck-plugin, parboiled, pegdown. The following packages became reproducible after getting fixed: Some uploads fixed some reproducibility issues but not all of them: Patches submitted which did not make their way to the archive yet:

reproducible.debian.net

A new variation to better notice when a package captures the environment has been introduced. (h01ger) The test on Debian packages works by building the package twice in a short time frame. But sometimes, a mirror push can happen between the first and the second build, resulting in a package built in a different build environment. This situation is now properly detected and will run a third build automatically. (h01ger) OpenWrt, the distribution specialized in embedded devices like small routers, is now being tested for reproducibility. The situation looks very good for their packages, which seem mostly affected by timestamps in the tarballs. System images will require more work on debbindiff to be better understood. (h01ger)

debbindiff development

Reiner Herrmann added support for decompiling Java .class files and for .ipk package files (used by OpenWrt). This is now available in version 22, released on 2015-06-14.

Documentation update

Stephen Kitt documented the new --insert-timestamp option, available since binutils-mingw-w64 version 6.2, used to insert a ready-made date in PE binaries built with mingw-w64.

Package reviews

195 obsolete reviews have been removed, 65 added and 126 updated this week. New identified issues:

Misc.

Holger Levsen reported an issue with the locales-all package that Provides: locales but is actually missing some of the files provided by locales. Coreboot upstream has been quick to react after the announcement of the tests set up the week before. Patrick Georgi has fixed all issues in a couple of days and all Coreboot images are now reproducible (without a payload). SeaBIOS is one of the most frequently used payloads on PC hardware and can now be made reproducible too. Paul Kocialkowski wrote to the mailing list asking for help on getting U-Boot tested for reproducibility. Lunar had a chat with maintainers of Open Build Service to better understand the difference between their system and what we are doing for Debian.

5 June 2015

Laura Arjona: Games

Note: this article is also available in Spanish here. I'm not a gamer, although I've probably played with machines/computers more than most of the girls of my age. My path has been: a handheld machine, Pong and tennis on my uncle's console, MSX (with that one, in addition to playing, I learnt what an algorithm and a program were, I started to write small programs in Basic, and I copied and ran the source code of small games and programs to make graphics, which I found in MSX magazines). My parents considered that arcade machines in bars were like slot machines, so they were banned for us (even pinball; only table soccer was spared, and only if my father was playing with us). On the MSX I played Magical Tree, Galaxians, Arkanoid, Konami's sports games, Spanish games from Dinamic, and some arcades like Golden Axe, Xenon, and maybe some more. The next computers (a PC AT, later a 286) were not really for gaming; let's say that we played more with the printer (Harvard Graphics, Bannermania...). Later, I was interested in other things more than in computer games, and then came high school homework, dBase III, and later the University and programming again and more, and that was the end of gaming; the computer was for office and Uni homework.
Later came the internet, and since then reading, writing and communicating have been more interesting for me than playing. I was not good at playing, and if you are not good, you play less, and you don't get better, so you begin to find other ways to lose your time, or to win it :)

The new generation

My son is 6 years old now, and with him I'm living a second adventure with games. Games have changed a lot, and our family computing tries to stay on the libre software side whenever I am the one who can decide, so sometimes challenges arise.

Android (phone and tablet)

The kid has played games on the phone and tablet with Android since he was a baby. We tried some of the popular games of the last few years. I am not so keen on banning things, but I don't feel comfortable with the popular games for Android (advertisements, nonfree software, addictive elements, massive data collection and possible surveillance...), so I try to control things without being Cruella de Vil. Some techniques I use: On the other side, in my phone there is no Google Play, so we have been able to discover the Games section of F-Droid. We have tried (all of them available in F-Droid, emphasis on the ones that he liked best): 2048, AndroFish, Bomber, Coloring for Kids, Core, Dodge, Falling Blocks, Free Fall, Frozen Bubble, HeriSwap (this one on the tablet), Hex, HyperRogue, Meerkat Challenge, Memory, Pixel Dungeon, Robotfindskitten, Slow it!, Tux Memory, Tux Rider, Vector Pinball. Playing on my phone with CyanogenMod, having downloaded the games from F-Droid, provides a relief similar to playing a non-computer game. At least with the games that I have listed above. Maybe it is because they are simpler games, or because they remind me of the ones that I played long ago. But it's also because of the peace of mind of knowing that they are libre software, that they have been audited by the F-Droid community, and that they don't abuse the user. The same happens with Debian, which takes me to the next part of this blogpost.

Computer games: Debian

The kid learnt to play with the tablet and the phone before the computer, because our computers have no joysticks nor touchscreens. He learnt to use the touchpad before the mouse, because it's easier and we have no mouse at home. He learnt to use the mouse at school, where they work with educational games via CD or via web, using Flash :( So Flash player appears and disappears from my Debian setup depending on his willingness to play with the "school games". Until a short time ago, on the computer we played with GCompris, ChildsPlay, and TuxPaint.
When he learnt to use the arrow keys, I installed Hannah and he liked it a lot, especially when we learnt to do hannah -l 900 :) Later, ClanTV ran advertisements for some online computer games based on their favourite series, and it turned out they need Flash or a framework called Unity3D (no, it's not Ubuntu's Unity). After digging a bit I decided that I was not going to install that #@%! in my Debian, so when he insisted on playing those games, I booted the Windows 7 partition on his father's laptop and installed it there. Windows is slow and sad on that computer, and those web games with that framework are not very light, so luckily they have not become very interesting. We have not played on the computer much more, maybe some incursions into Minetest, which takes me to the next section. (Not without stating my eternal thanks to the Games Team in Debian. I think they do very important work, and I think that next year I'll try to get involved in some way, because I know that the future of our family's computer games is tied to libre games in Debian.)

PlayStation 3 and Minecraft

Some time ago my husband bought a PlayStation 3 for home. The shop had discount prices and so on. He would play together with the kid and so on.
The machine came home with some games for free (included in the price), but most of them were rated 13+ or so, so the only two left were Pro Evolution Soccer and Minecraft.
I decided not to connect the machine to the network. Maybe we are losing cool things, but I feel safer like that. So, no ethernet cable plugged in, no registration in the Sony shop (or whatever its name is). The controllers are quite complex for the three of us. They are DualShock don't-know-what, and I think there is something (software) that makes the game adapt to the person playing, because my husband is a worse player after the son plays, if they play in turns and use the same controller. The kid liked Minecraft. I didn't know anything about that game (well, I knew that there was a libre clone called Minetest), so, to learn the basics, I had a look at the wiki, searched for how-to videos, and we began learning. Now the kid can read a bit so he needs less help, and he has watched a lot of videos about Minecraft, so he is interested in exploring and building. I had a look at Minetest and installed it in Debian. Having to use the keyboard is a disadvantage, and we didn't know how to dig, so it was not very attractive at first sight. I have looked a bit at how to use the PS3 controller with the computer, via USB, and it seems to work, but I suppose I need to write something to match each controller button with the corresponding key and subsequent action in Minetest. This is work, and I am lazy, and the boy seems not very interested in playing with the computer. Watching the videos we inferred that it's possible to download saved games and worlds and upload them to the videogame console. We did some tests. I wanted to upload a saved game of an amusement park, but the file was in a folder named NPEB01899* and even though the PS3 saw it when copying it from USB to the console, it later didn't appear in the list of saved games (our saved games were in folders named BLES01976). And renaming the folder didn't work, of course. I understood that we had met Sony's restrictions, so I searched for more info. The games are saved using an encryption key, and you are not able to use saved games from consoles in another region zone or bought on a different kind of media than yours (the game can be played from a disc or purchased in the digital shop, it seems). All of this is very ugly! I read somewhere that there is certain software (libre software, BTW) that allows breaking the encryption and re-encrypting the saved game with the zone and type of media of your console, but it seems the program only works on Windows, and it needs a console ID that we don't have, because we didn't register the console in the PlayStation network. All these things look like shaky ground to me, unpleasant stuff; I don't want to spend time on this. Maybe I should learn a bit more about Minetest, make it work and make it interesting, and tell Sony to go fly a kite. Finally, I found a saved game in the same format as ours (BLES01976); it's not an amusement park but it is a world with interesting places to explore and many things already built, so I've tried to import it and it worked, so my son will be happy for some time, I suppose. We have tried Minetest on the tablet too, but the touchscreen is not comfortable for this kind of game. I feel quite frustrated and angry about this issue of Sony's restrictions on saved games. So I suppose that in the next months I'll try to learn more about Minetest in Debian, game controllers in Debian, and games in Debian in general. I hope to be able to offer cool stuff to my son, so he becomes more interested in playing in a safe environment which does not abuse the user.
And with this, I finish.

Libre games in GNU/Linux, Debian, and info about games on the internet

When we have searched for info about games on the internet, I have found that many times you need to step out of the secure environment: webpages with download links that may or may not contain what they say they contain, advertisements, videoblogs with language not adequate for kids (or any person who loves their mother language)... That's why I believe the path is to go into detail about the libre games provided by the distro you use (Debian in my case). Here I bookmark a list of websites with info that will surely be useful for me, to read in depth: We'll see how it goes. Comments? You can comment in this Pump.io thread.
Filed under: My experiences and opinion Tagged: Debian, English, F-Droid, Free culture, Free Software, Games, libre software, Moving into free software

10 March 2015

Joey Hess: 7drl 2015 day 4 coding through exhaustion

Slow start today; I was pretty exhausted after yesterday and last night's work. Somehow though, I got past the burn and made major progress today. All the complex movement of both the player and the scroll is finished now, and all that remains is to write interesting spells, and a system for learning spells, and to balance out the game difficulty.
I haven't quite said what Scroll is about yet, let's fix that: In Scroll, you're a bookworm that's stuck on a scroll. You have to dodge between words and use spells to make your way down the page as the scroll is read. Go too slow and you'll get wound up in the scroll and crushed. The character is multiple chars in size (size is the worm's only stat), and the worm interacts with the scroll in lots of ways, like swallowing letters, or diving through a hole to the other side of the scroll. While it can swallow some letters, if it gets too full, it can't move forward anymore, so letters are mostly consumed to be used as spell components. I think that I will manage to get away without adding any kind of monsters to the game; the scroll (and whoever is reading it) is the antagonist. As I'm writing this very post, I'm imagining the worm wending its way through my paragraphs. This dual experience of text, where you're both reading its content and hyper-aware of its form, is probably the main thing I wanted to explore in writing Scroll. As to the text that fills the scroll, it's broadly procedurally generated, in what I hope are unusual and repeatedly surprising (and amusing) ways. I'm not showing any screenshots of the real text, because I don't want to give that surprise away. But, the other thing about Scroll is that it's scroll, a completely usable (if rather difficult..) Unix pager!

15 February 2015

Matthew Palmer: The Vicious Circle of Documentation

Ever worked at a company (or on a codebase, or whatever) where it seemed like, no matter what the question was, the answer was written down somewhere you could easily find it? Most people haven't, sadly, but they do exist, and I can assure you that it is an absolute pleasure. On the other hand, practically everyone has experienced completely undocumented systems and processes, where knowledge is shared by word-of-mouth, or lost every time someone quits. Why are there so many more undocumented systems than documented ones out there, and how can we cause more well-documented systems to exist? The answer isn't "people are lazy", and the solution is simple though not easy.

Why Johnny Doesn't Read

When someone needs to know something, they might go look for some documentation, or they might ask someone else or just guess wildly. The behaviour "look for documentation" is often reinforced negatively, by the result "documentation doesn't exist". At the same time, the behaviours "ask someone" and "guess wildly" are positively reinforced, by the results "I get my question answered" and/or "at least I can get on with my work". Over time, people optimise their behaviour by skipping the "look for documentation" step, and just go straight to asking other people (or guessing wildly).

Why Johnny Doesn't Write

When someone writes documentation, they're hoping that people will read it and not have to ask them questions in order to be productive and do the right thing. Hence, the behaviour "write documentation" is negatively reinforced by the results "I still get asked questions" and "nobody does things the right way around here, dammit!" Worse, though, is that there is very little positive reinforcement for the author: when someone does read the docs, and thus doesn't ask a question, the author almost certainly doesn't know they dodged a bullet. Similarly, when someone does things the right way, it's unlikely that anyone will notice. It's only the mistakes that catch the attention. Given that the experience of writing documentation tends to skew towards the negative, it's not surprising that eventually, the time spent writing documentation is reallocated to other, more utility-producing activities.

Death Spiral

The combination of these two situations is self-reinforcing. While a suitably motivated reader might start by strictly looking for documentation, or an author might initially be enthused enough to always fully document their work, over time the reflex will be for readers to just go ask someone, "because there's never any documentation!", and for authors to not write documentation, "because nobody bothers to read what I write anyway!". It is important to recognise that this iterative feedback loop is the natural state of the reader/author ecosystem, resulting in something akin to thermodynamic entropy. To avoid the system descending into chaos, energy needs to be constantly applied to keep the system in order.

The Solution

Effective methods for avoiding the vicious circle can be derived from the things that cause it. Change the forces that apply themselves to readers and authors, and they will behave differently.

On the reader's side, the most effective way to encourage people to read documentation is for it to consistently exist. This means that those in control of a project or system mustn't consider something "done" until the documentation is in a good state. Patches shouldn't be landed, and releases shouldn't be made, unless the documentation is altered to match the functional changes being made. Yes, this requires discipline, which is just a form of energy application to prevent entropic decay. Writing documentation should be an explicit and well-understood part of somebody's job description. Whoever is responsible for documentation needs to be given the time to do it properly. Writing well takes time and mental energy, and that time needs to be factored into the plans. Never forget that skimping on documentation, like short-changing QA or customer support, is a false economy that will cost more in the long term than it saves in the short term.

Even if the documentation exists, though, some people are going to tend towards asking people rather than consulting the documentation. This isn't a moral failing on their part, but only happens when they believe that asking someone is more beneficial to them than going to the documentation. To change the behaviour, you need to change the belief. You could change the belief by increasing the cost of asking. You could fire (or hellban) anyone who ever asks a question that is answered in the documentation. But you shouldn't. You could yell "RTFM!" at everyone who asks a question. Thankfully that's one acronym that's falling out of favour. Alternately, you can reduce the cost of getting the answer from the documentation. Possibly the largest single productivity boost for programmers, for example, has been the existence of Google. Whatever your problem, there's a pretty good chance that a search or two will find a solution. For your private documentation, you probably don't have the power of Google available, but decent full-text search systems are available. Use them.

Finally, authors would benefit from more positive reinforcement. If you find good documentation, let the author know! It requires a lot of effort (comparatively) to look up an author's contact details and send them a nice e-mail. The "like" button is a more low-energy way of achieving a similar outcome: you click the button, and the author gets a warm, fuzzy feeling. If your internal documentation system doesn't have some way to close the loop and let readers easily give authors a bit of kudos, fix it so it does. Heck, even if authors just know that a page they wrote was loaded N times in the past week, that's better than the current situation, in which deafening silence persists, punctuated by the occasional plaintive cry of "Hey, do you know how to...?". Do you have any other ideas for how to encourage readers to read, and for authors to write?

3 January 2015

Russ Allbery: Review: The Ring of Charon

Review: The Ring of Charon, by Roger MacBride Allen
Series: Hunted Earth #1
Publisher: Tor
Copyright: December 1990
ISBN: 0-8125-3014-4
Format: Mass market
Pages: 500
Larry Chao is a junior scientist at a gravity research facility on Pluto, at the very outer limits of human reach in the solar system. The facility is for researching artificial gravity, which is one reason it's in the middle of nowhere. Another is that their experimental generator is built in a ring around Charon, and the close proximity of the two bodies is useful for scientific observation. Unfortunately, there hasn't been much to observe. They can create very short-lived gravity fields in very small areas, but nothing like the artificial gravity that was the original promise of the facility. As a result, the government is shutting the facility down. The authoritarian director, Simon Raphael, is... not exactly happy with that decision, but resigned to it and running the facility to completion with a sullen anger. When Larry makes a startling breakthrough at nearly the last minute, Simon is uninterested and hostile. This leads Larry and his fellow scientist Sondra Berghoff to attempt a more radical demonstration of Larry's success and prove to the rest of the solar system that the facility should be kept open. That decision has a far deeper impact on humanity and the solar system than they could have possibly imagined. The Ring of Charon and its sequel, Shattered Sphere, were recommended to me as good harder science fiction. It took me a while to track down copies; in fact, it took an in-person trip to Powell's. Once I found them, a relatively straightforward, old-school science fiction novel seemed like just the thing to read during my commute. Allen delivers there. I'm not spoiling the main plot driver of the book even though it's given away on the back cover, since it's some time in coming in the novel. But The Ring of Charon turns into a multi-viewpoint cross between a disaster novel and a scientific investigation. Larry and Sondra stay central to the plot, of course, but Allen adds a variety of other characters who are attempting to figure out what happened to the solar system and then deal with the consequences: everyone from scientists to pilots to communications officers in weird surrealistic stations. The science is mostly believable, apart from the scientific breakthrough that motivates the plot. Characterization isn't absent completely, but it's simple and unsubtle; dialogue is a bit wooden and characters aren't entirely multidimensional, but Allen does a reasonably good job with both pacing and the sense of mystery and investigation, and a surprisingly good job portraying organizational politics. As you might guess from the tone of my review, this is not the book to reach for if you want something ground-breaking. It's a very conventional, multi-viewpoint SF novel full of scientists and investigation of unknown and possibly hostile phenomena. If you've read much science fiction, you've read books like this before. But one thing that Allen does surprisingly well, which makes The Ring of Charon stand a bit above the pack, is that he doesn't write villains. Even Simon, who goes out of his way to make the reader hate him at the start of the book, becomes a surprisingly sympathetic character. The characters who are usually villains or at least foils in books like this (the smooth PR person, the religious man, the blindly-focused scientist who isn't interested in anyone else's theories) never turn into caricatures, play important roles in the plot, and turn out to be competent in their fields of expertise.
There are actual villains, sort of, but I found myself feeling sympathetic even towards them, at least in places. Allen takes a rather old SF dodge to achieve conflict without an evil enemy, and, because of that, the end of the book felt like a bit of an anticlimax. But I did like the feel of the book where there isn't a good versus evil fight, just a wide variety of people (and others) trying to understand and control the universe in the best ways they know how. I'm not sure I can quite recommend this book. The quality of the writing is not particularly high, and I'm not generally a fan of the disaster novel style of storytelling. But despite not being very original, there's just something likable about it. It moves along reasonably well for a 500 page book, and it's refreshingly light on irritating stereotypes. I think one has to be in the right mood when reading it and set expectations accordingly, but it fit what I was looking for when I picked it up. One warning, though: although The Ring of Charon reaches a climax, the major plot conflict is not resolved at the end of this book, so you may want to have the sequel around. Followed by Shattered Sphere. Rating: 6 out of 10

21 December 2014

Russ Allbery: Review: 2014 Hugos: Novelettes

Review: 2014 Hugos: Novelettes, edited by Loncon 3
Publisher: Loncon 3
Copyright: 2014
Format: Kindle
This is another weird "book review" covering the Hugo-nominated novelettes for the 2014 Hugos (given for works published in 2013) at Loncon 3, the 2014 Worldcon. The "editor" is the pool of attendees and supporting members who chose to nominate works, all of which had been previously edited by other editors in their original publication. I received all of these as part of the Hugo voter's packet for being a supporting member, but they all appear to be available for free on the Internet (at least at the time of this writing). "The Exchange Officers" by Brad Torgersen: An okay, if not particularly ground-breaking, military SF story, ruined for me by the ham-handed introduction of superficial jingoism. The protagonists are doing a tour as remote operators of humanoid battle suits in orbit: not a new premise, but a serviceable one. Since this is military SF, they predictably have to defend a space installation against attackers. So we get a bit of drama, a bit of zero-g combat, and the fun of people learning how to remotely operate suits. You've probably read this before, but it passes the time reasonably well. Unfortunately, Torgersen decided to make the villains the Chinese military for no adequately-explained reason. (Well, I'm being kind; I suspect the reason is the standard yellow peril nonsense, but that's less generous.) So there is snide commentary about how only the military understand the Chinese threat and a fair bit of old-fashioned jingoism mixed into the story, to its detriment. If you like this sort of thing, it's a typical example, although it escapes me why people thought it was exceptional enough to warrant nomination. (5) "The Lady Astronaut of Mars" by Mary Robinette Kowal: Once again, my clear favorite among the stories also won, which is a lovely pattern. Elma was the female astronaut in an alternate history in which manned space exploration continued to grow, leading to permanent settlement on Mars. She spent lots of time being photographed, being the smiling face of the space program, while her husband worked on the math and engineering of the launches. Now, she's an old woman, taking care of her failing and frail husband, her career behind her. Or so she thinks, before an offer that forces an impossible choice between space and staying with her husband for his final days. This is indeed the tear-jerker that it sounds like, but it's not as maudlin as it might sound. Kowal does an excellent job with Elma's characterization: she's no-nonsense, old enough to be confident in her opinions, and knows how to navigate through the world. The story is mixed with nostalgia and memories, including a reminder of just what Elma meant to others. It touches on heroism, symbolism, and the horrible choices around dying loved ones, but I thought it did so deftly and with grace. I was expecting the story to be too obvious, but I found I enjoyed the quotidian feel. It's not a story to read if you want to be surprised, but I loved the small touches. (9) "Opera Vita Aeterna" by Vox Day: Before the review, a note that I consider obligatory. The author of this story is an aggressively misogynistic white supremacist, well-known online for referring to black people as savages and arguing women should not be allowed to vote. To what extent you choose to take that into account when judging his fiction is up to you, but I don't think it should go unsaid.
"Opera Vita Aeterna" is the story of a monastery in a typical fantasy world (at least as far as one can tell from this story; readers of Vox Day's fantasy series will probably know more background). At the start of the story, it gets an unexpected visit from an elf. Not just any elf, either, but one of the most powerful magicians of his society. He comes to the monastery out of curiousity about the god that the monks worship and stays for a project of illuminating their scriptures, while having theological debates with the abbot. This story is certainly not the offensive tirade that you might expect from its author. Its biggest problem is that nothing of substance happens in the story, either theologically or via more conventional action. It's a lot of description, a lot of talking, a lot of warmed-over Christian apologetics that dodges most of the hard problems, and a lot of assertions that the elf finds something of interest in this monastery. I can believe this could be the case, but Vox Day doesn't really show why. There is, at the end of the story, some actual drama, but I found it disappointing and pointless. It leads nowhere. The theology has the same problem: elves supposedly have no souls, which is should be the heart of a theological question or conflict Vox Day is constructing, but that conflict dies without any resolution. We know nothing more about the theology of this world at the end of the story than we do at the beginning. Some of the descriptions here aren't bad, and the atmosphere seems to want to develop into a story. But that development never happens, leaving the whole work feeling fundamentally pointless. (4) "The Truth of Fact, the Truth of Feelng" by Ted Chiang: This is another oddly-constructed story, although I think a bit more successful. It's a story in two interwoven parts. One is a fictional essay, told in a non-fiction style, about a man living in a future world with ubiquitous life recording and very efficient search software. Any part of one's life can be easily found and reviewed. The other is the story of a boy from a tribal culture during European colonialism. He learns to read and write, and from that a respect for written records, which come into conflict with the stories that the tribe elders tell about the past. The purpose of both of these stories is to question both the value and the implications of recording everything in a way that preserves and guarantees the facts instead of individual interpretations. The boy's story calls this into question; the narrator's story offers ambiguous support for its value and a deeper plea for giving people space to change. I found the style a bit difficult to get used to, since much of it did not feel like a story. But it grew on me as I read it, and the questions Chiang raises have stuck with me since. The problem of how and when to allow for change in others when we have perfect (or at least vastly improved) memory is both important and complicated, and this is one of the better presentations of the problem that I've seen. It's more of a think-y piece, and closer to non-fiction than a story, but I thought it was worth reading. (8) "The Waiting Stars" by Aliette de Bodard: I keep wanting to like de Bodard's space opera world of AIs and living ships, but it never quite works for me. I've read several stories set in this universe now, and it has some neat ideas, but I always struggle with the characters. 
This story at least doesn't have quite as much gruesome pregnancy as the previous ones (although there's still some). "The Waiting Stars" opens with a raid on a ship graveyard, an attempt to rescue and "reboot" an AI starship under the guidance of another Mind. This is intermixed with another story about a woman who was apparently rescued in childhood from birthing ship Minds and raised in a sort of foster institution. This feels like a flashback at first, but its interaction with the rest of the story is something more complicated. The conceptual trick de Bodard pulls here is thought-provoking, but once again I struggled to care about all of the characters. I also found the ending discouraging and unsatisfying, which didn't help. Someone who isn't me might really like this story, but it wasn't my thing. (6) Rating: 6 out of 10

4 December 2014

Olivier Berger: Shell script to connecting to a Shibboleth protected web app with curl

Here's a shell script I've created (reusing one meant for CAS-protected resources), which will let you connect to a web application protected by the Shibboleth SSO mechanism.
It uses cURL to navigate through the various jumps required by the protocol, perform the necessary posts, etc.
I haven't read the Shibboleth specs, so it may not be the best way, and may not work in all cases, but it was enough for my case, at least. Feel free to improve it on GitHub Gists. After the connection is successful, one may reuse the .cookieJar file to perform further cURL connections, or even some automated content mirroring with httrack, for instance (see a previous experiment of mine with httrack for Moodle).
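For readers who just want the general shape of such a script, here is a rough sketch of the cURL pattern; the URLs, form field names and credentials are placeholders, and a real Shibboleth flow additionally requires re-posting the SAMLResponse form back to the service provider, which is the part the actual script automates:
#!/bin/sh
# Hypothetical sketch only: keep session cookies in a jar, follow the SSO
# redirects to the IdP, post the login form, then reuse the jar afterwards.
JAR=.cookieJar
APP_URL="https://app.example.org/protected/"
IDP_LOGIN_URL="https://idp.example.org/idp/Authn/UserPassword"
# 1. Request the protected page; -L follows the redirects towards the IdP.
curl -s -L -c "$JAR" -b "$JAR" -o /dev/null "$APP_URL"
# 2. Post the credentials to the IdP login form (field names vary per IdP).
curl -s -L -c "$JAR" -b "$JAR" \
     --data-urlencode "j_username=alice" \
     --data-urlencode "j_password=secret" \
     -o /dev/null "$IDP_LOGIN_URL"
# 3. Reuse the cookie jar for further requests to the application.
curl -s -b "$JAR" "$APP_URL"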

23 November 2014

Matthew Palmer: You stay classy, Uber

You may have heard that Uber has been under a bit of fire lately for its desires to hire private investigators to dig up dirt on journalists who are critical of Uber. From using users' ride data for party entertainment, putting the assistance dogs of blind passengers in the trunk, adding a surcharge to reduce the number of dodgy drivers, or even booking rides with competitors and then cancelling, or using the ride to try and convince the driver to change teams, it's pretty clear that Uber is a pretty good example of how companies are inherently sociopathic. However, most of those examples are internal stupidities that happened to be made public. It's a very rare company that doesn't do all sorts of shady things, on the assumption that the world will never find out about them. Uber goes quite a bit further, though, and is so out-of-touch with the world that it blogs about analysing people's sexual activity for amusement. You'll note that if you follow the above link, it sends you to the Wayback Machine, and not Uber's own site. That's because the original page has recently turned into a 404. Why? Probably because someone at Uber realised that bragging about how Uber employees can amuse themselves by perving on your one-night stands might not be a great idea. That still leaves the question open of what sort of a corporate culture makes anyone ever think that inspecting user data for amusement would be a good thing, let alone publicising it? It's horrific. Thankfully, despite Uber's fairly transparent attempt at whitewashing ("clearwashing"?), the good ol' Wayback Machine helps us to remember what really went on. It would be amusing if Uber tried to pressure the Internet Archive to remove their copies of this blog post (don't bother, Uber; I've got a "Save As" button and I'm not afraid to use it). In any event, I've never used Uber (not that I've got one-night stands to analyse, anyway), and I'll certainly not be patronising them in the future. If you're not keen on companies amusing themselves with your private data, I suggest you might consider doing the same.

24 October 2014

Enrico Zini: systemd-cryptsetup-password

cryptsetup password and parallel boot

Since parallel boot happened, the cryptsetup password prompt on my system gets flooded with other boot messages during boot. I fixed it, as suggested in #764555, by installing plymouth and then editing /etc/default/grub to add splash to GRUB_CMDLINE_LINUX_DEFAULT:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"
Besides showing pretty pictures (and, most importantly, getting them out of my way if I press ESC), plymouth also provides a user prompt that works with parallel boot, which sounds like what I needed.
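A minimal sketch of the whole change on a stock Debian system (the sed pattern assumes the default single GRUB_CMDLINE_LINUX_DEFAULT line; adjust to taste):
# Install plymouth, add "splash" to the kernel command line, and regenerate
# the GRUB configuration so the change takes effect on the next boot.
apt-get install plymouth
sed -i 's/^GRUB_CMDLINE_LINUX_DEFAULT=.*/GRUB_CMDLINE_LINUX_DEFAULT="quiet splash"/' /etc/default/grub
update-grub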

14 October 2014

Julian Andres Klode: Key transition

I started transitioning from 1024D to 4096R. The new key is available at: https://people.debian.org/~jak/pubkey.gpg and the keys.gnupg.net key server. A very short transition statement is available at: https://people.debian.org/~jak/transition-statement.txt and included below (the http version might get extended over time if needed). The key consists of one master key and 3 sub keys (signing, encryption, authentication). The sub keys are stored on an OpenPGP v2 Smartcard. That's really cool, isn't it? Somehow it seems that GnuPG 1.4.18 also works with 4096R keys on this smartcard (I accidentally used it instead of gpg2 and it worked fine), although only GPG 2.0.13 and newer is supposed to work.
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1,SHA512
Because 1024D keys are not deemed secure enough anymore, I switched to
a 4096R one.
The old key will continue to be valid for some time, but i prefer all
future correspondence to come to the new one.  I would also like this
new key to be re-integrated into the web of trust.  This message is
signed by both keys to certify the transition.
the old key was:
pub   1024D/00823EC2 2007-04-12
      Key fingerprint = D9D9 754A 4BBA 2E7D 0A0A  C024 AC2A 5FFE 0082 3EC2
And the new key is:
pub   4096R/6B031B00 2014-10-14 [expires: 2017-10-13]
      Key fingerprint = AEE1 C8AA AAF0 B768 4019  C546 021B 361B 6B03 1B00
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v2
iEYEARECAAYFAlQ9j+oACgkQrCpf/gCCPsKskgCgiRn7DoP5RASkaZZjpop9P8aG
zhgAnjHeE8BXvTSkr7hccNb2tZsnqlTaiQIcBAEBCgAGBQJUPY/qAAoJENc8OeVl
gLOGZiMP/1MHubKmA8aGDj8Ow5Uo4lkzp+A89vJqgbm9bjVrfjDHZQIdebYfWrjr
RQzXdbIHnILYnUfYaOHUzMxpBHya3rFu6xbfKesR+jzQf8gxFXoBY7OQVL4Ycyss
4Y++g9m4Lqm+IDyIhhDNY6mtFU9e3CkljI52p/CIqM7eUyBfyRJDRfeh6c40Pfx2
AlNyFe+9JzYG1i3YG96Z8bKiVK5GpvyKWiggo08r3oqGvWyROYY9E4nLM9OJu8EL
GuSNDCRJOhfnegWqKq+BRZUXA2wbTG0f8AxAuetdo6MKmVmHGcHxpIGFHqxO1QhV
VM7VpMj+bxcevJ50BO5kylRrptlUugTaJ6il/o5sfgy1FdXGlgWCsIwmja2Z/fQr
ycnqrtMVVYfln9IwDODItHx3hSwRoHnUxLWq8yY8gyx+//geZ0BROonXVy1YEo9a
PDplOF1HKlaFAHv+Zq8wDWT8Lt1H2EecRFN+hov3+lU74ylnogZLS+bA7tqrjig0
bZfCo7i9Z7ag4GvLWY5PvN4fbws/5Yz9L8I4CnrqCUtzJg4vyA44Kpo8iuQsIrhz
CKDnsoehxS95YjiJcbL0Y63Ed4mkSaibUKfoYObv/k61XmBCNkmNAAuRwzV7d5q2
/w3bSTB0O7FHcCxFDnn+tiLwgiTEQDYAP9nN97uibSUCbf98wl3/
=VRZJ
-----END PGP SIGNATURE-----
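For anyone who wants to check the transition themselves, a minimal sketch using the key IDs and URLs quoted above (gpg2 is assumed; any recent GnuPG should do):
# Fetch both keys and the signed transition statement, then check that the
# statement carries good signatures from both the old and the new key.
gpg2 --keyserver keys.gnupg.net --recv-keys 0x00823EC2 0x6B031B00
wget https://people.debian.org/~jak/transition-statement.txt
gpg2 --verify transition-statement.txt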

Filed under: Uncategorized

13 October 2014

John Goerzen: Update on the systemd issue

The other day, I wrote about my poor first impressions of systemd in jessie. Here's an update. I'd like to start with the things that are good. I found the systemd community to be one of the most helpful in Debian, and the #debian-systemd IRC channel to be especially helpful. I was in there for quite some time yesterday, and appreciated the help from many people, especially Michael. This is a nontechnical factor, but is extremely important; this has significantly allayed my concerns about systemd right there.

There are things about the systemd design that impress. The dependency system and configuration system is a lot more flexible than sysvinit. It is also a lot more complicated, and difficult to figure out what's happening. I am unconvinced of the utility of parallelization of boot to begin with; I rarely reboot any of my Linux systems, desktops or servers, and it seems to introduce needless complexity.

Anyhow, on to the filesystem problem, and a bit of background. My laptop runs ZFS, which is somewhat similar to btrfs in that it's a volume manager (like LVM), RAID manager (like md), and filesystem in one. My system runs LVM, and inside LVM, I have two ZFS pools (volume groups): one, called rpool, that is unencrypted and holds mainly the operating system; and the other, called crypt, that is stacked atop LUKS. ZFS on Linux doesn't yet have built-in crypto, which is why LVM is even in the picture here (to separate out the SSD at a level above ZFS to permit parts of it to be encrypted). This is a bit of an antiquated setup for me; as more systems have AES-NI, I'm moving towards everything except /boot being encrypted. Anyhow, inside rpool is the / filesystem, /var, and /usr. Inside crypt is /tmp and /home.

Initially, I tried to just boot it, knowing that systemd is supposed to work with LSB init scripts, and ZFS has init scripts with carefully-planned dependencies. This was evidently not working, perhaps because /lib/systemd/systemd/ It turns out that systemd has a few assumptions that turn out to be less true with ZFS than otherwise. ZFS filesystems are normally not mounted via /etc/fstab; a ZFS pool has internal properties about which dataset gets mounted where (similar to LVM's actions after a vgscan and vgchange -ay). Even though there are ordering constraints in the units, systemd is writing files to /var before /var gets mounted, resulting in the mount failing (unlike ext4, ZFS by default will reject an attempt to mount over a non-empty directory). Partly this is due to the debian-fixup.service, and partly it is due to systemd reacting to udev items like backlight. This problem was eventually worked around by doing zfs set mountpoint=legacy rpool/var, and then adding a line to fstab ("rpool/var /var zfs defaults 0 2") for /var and its descendent filesystems. This left the problem of /tmp; again, it wasn't getting mounted soon enough. In this case, it required crypttab to be processed first, and there seem to be a lot of bugs in the crypttab processing in systemd (more on that below). I eventually worked around that by adding After=cryptsetup.target to the zfs-import-cache.service file. For /tmp, it did NOT work to put it in /etc/fstab, because then it tried to mount it before starting cryptsetup for some reason. It probably didn't help that the system's cryptdisks.service is a symlink to /dev/null, a fact I didn't realize until after a lot of needless reboots. Anyhow, one thing I stumbled across was poor console control with systemd.
On numerous occasions, I had things like two cryptsetup processes trying to read a password, plus an emergency mode console trying to do so. I had this memorable line of text at one point: (or type Control-D to continue): Please enter passphrase for disk athena-crypttank (crypt)! [ OK ] Stopped Emergency Shell. And here we venture into unsatisfying territory with systemd. One answer to this in IRC was to install plymouth, which apparently serializes console I/O. However, plymouth puts an attractive boot animation in place of the text messages that normally get shown. I don't want an "attractive boot animation". Nevertheless, neither systemd-sysv nor cryptsetup depends on plymouth, so by default, the prompt for a password at boot is obscured by various other text. Worse, plymouth doesn't support serial consoles, so at the moment booting a system that uses LUKS with systemd over a serial console is a matter of blind luck of typing the right password at the right time. In the end, though, the system booted and after a few more tweaks, the backlight buttons do their thing again. Whew! Update 2014-10-13: uau pointed out that Plymouth is more than a bootsplash, and can work with serial consoles, despite the description of the package. I stand corrected on that. (It is still the case, however, that packages don't depend on it where they should, and the default experience for people using cryptsetup is not very good.)
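To gather the ZFS workarounds described above in one place, the changes amounted to roughly the following sketch (a drop-in override would be the tidier way to add the After= line; treat this as a summary, not the exact unit files):
# Switch rpool/var to a legacy mountpoint so systemd mounts it via fstab,
# and make the ZFS pool import wait for cryptsetup.
zfs set mountpoint=legacy rpool/var
echo 'rpool/var /var zfs defaults 0 2' >> /etc/fstab
# In zfs-import-cache.service, under [Unit]:
#   After=cryptsetup.target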

3 October 2014

Ian Donnelly: Config::Model and Elektra

Hi Everybody, Today I want to talk about the different approaches of Elektra and Config::Model. We have gotten a lot of questions lately about why Elektra is necessary and what differentiates it from similar tools like Config::Model. While there are a lot of similarities between Config::Model and Elektra, there are some key differences, and that is what I will be focusing on in this post. Once a specification is defined for Elektra and a plug-in is written to work with that specification, other developers will be able to reuse these specifications for programs that have similar configurations (such as a specification and plug-in for the INI file type). Additionally, specifications, once defined in KDB, can be used across multiple programs. For instance, if I were to define a specification for my program within KDB:
[/myapp/file_dialog/show_hidden_files]
type=Boolean
default=true
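For illustration only, such a value could then be read or overridden with Elektra's kdb command-line tool; the exact paths and namespace here are just an example, not something defined above:
# Hypothetical usage sketch: read the configured value (falling back to the
# spec default) and override it for the current user.
kdb get /myapp/file_dialog/show_hidden_files
kdb set user/myapp/file_dialog/show_hidden_files false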

Any other program could use my specification just by referring to show_hidden_files. These features allow Elektra to solve the problem of cross-platform configurations by providing a consistent API, and also allow users to easily be aware of other applications' configurations, which allows for easier integration between programs. Config::Model also aims to provide a unified interface for configuration data, and it also supports validation, such as the type=Boolean constraint in the example above. The biggest difference between these two projects is that Elektra is intended for use by the programs themselves and by external GUIs and validation tools, unlike Config::Model. Config::Model provides a tool allowing developers to provide a means for users to interactively edit configuration data in a safe way. Additionally, Elektra uses self-describing data. This means that all the specifications are saved within KDB and in metadata. Other differences are that validators can be written in any language for Elektra, because the specifications are just stored as data, and that they can enforce constraints on any access, because plug-ins define the behavior of KDB itself. Tying this all together with my GSoC project is the topic of three-way merges. Config::Model actually does not rely on a base for merges, since the specifications all must be complete. This is a very good approach to handle merges in an advanced way too. This is an avenue that Elektra would like to explore in the future, when we have enough specifications to handle all types of configuration. I hope that this post clarifies the different approaches of Elektra and Config::Model. While both of these tools offer a better answer to configuration files, they have different goals and implementations that make them unique. I want to mention that we have a good relationship with the developers of Config::Model, who supported my Google Summer of Code project. We believe that both of these tools have their own place and uses, and they do not compete to achieve the same goals. For now,
Ian S. Donnelly

14 September 2014

Gregor Herrmann: RC bugs 2014/34-37

the perl 5.20 transition is over, debconf14 is over, so I should have more time for RC bugs? yes & no: I fixed some, but only in "our" (as in: pkg-perl) packages:

24 July 2014

Matthew Palmer: First Step with Clojure: Terror

$ sudo apt-get install -y leiningen
[...]
$ lein new scratch
[...]
$ cd scratch
$ lein repl
Downloading: org/clojure/clojure/1.3.0/clojure-1.3.0.pom from repository central at http://repo1.maven.org/maven2
Transferring 5K from central
Downloading: org/sonatype/oss/oss-parent/5/oss-parent-5.pom from repository central at http://repo1.maven.org/maven2
Transferring 4K from central
Downloading: org/clojure/clojure/1.3.0/clojure-1.3.0.jar from repository central at http://repo1.maven.org/maven2
Transferring 3311K from central
[...]
Wait what? lein downloads some random JARs from a website over HTTP1, with, as far as I can tell, no verification that what I'm asking for is what I'm getting (has nobody ever heard of Man-in-the-Middle attacks in Maven land?). It downloads a .sha1 file to (presumably) do integrity checking, but that's no safety net: if I can serve you a dodgy .jar, I can serve you an equally-dodgy .sha1 file, too (also, SHA256 is where all the cool kids are at these days). Finally, jarsigner tells me that there's no signature on the .jar itself, either. It gets better, though. The repo1.maven.org site is served by the fastly.net2 pseudo-CDN3, which adds another set of points in the chain which can be subverted to hijack and spoof traffic. More routers, more DNS zones, and more servers. I've seen Debian take a kicking more than once because packages aren't individually signed, or because packages aren't served over HTTPS. But at least Debian's packages can be verified by chaining to a signature made by a well-known, widely-distributed key, signed by two Debian Developers with very well-connected keys. This repository, on the other hand... oy gevalt. There are OpenPGP (GPG) signatures available for each package (tack .asc onto the end of the .jar URL), but no attempt was made to download the signatures for the .jar I downloaded. Even if the signature was downloaded and checked, there's no way for me (or anyone) to trust the signature: the signature was made by a key that's signed by one other key, which itself has no signatures. If I were an attacker, it wouldn't be hard for me to replace that key chain with one of my own devising. Even ignoring everyone living behind a government- or company-run intercepting proxy, and everyone using public wifi, it's pretty well common knowledge by now (thanks to Edward Snowden) that playing silly-buggers with Internet traffic isn't hard to do, and there's no shortage of evidence that it is, in fact, done on a routine basis by all manner of people. Serving up executable code to a large number of people, in that threat environment, with no way for them to have any reasonable assurance that code is trustworthy, is very disappointing. Please, for the good of the Internet, improve your act, Maven. Putting HTTPS on your distribution would be a bare minimum. There are attacks on SSL, sure, but they're a lot harder to pull off than sitting on public wifi hijacking TCP connections. Far better would be to start mandating signatures, requiring signature checks to pass, and having all signatures chain to a well-known, widely-trusted, and properly secured trust root. Signing all keys that are allowed to upload to maven.org with a maven.org distribution root key (itself kept in hardware and only used offline), and then verifying that all signatures chain to that key, wouldn't be insanely difficult, and would greatly improve the security of the software supply chain. Sure, it wouldn't be perfect, but don't make the perfect the enemy of the good. Cost-effective improvements are possible here. Yes, security is hard. But you don't get to ignore it just because of that, when you're creating an attractive nuisance for anyone who wants to own up a whole passel of machines by slipping some dodgy code into a widely-used package.
  1. To add insult to injury, it appears to ignore my http_proxy environment variable, and the repo1.maven.org server returns plain-text error responses with Content-Type: text/xml. But at this point, that's just icing on the shit cake.
  2. At one point in the past, my then-employer (a hosting provider) blocked Fastly's caching servers from their network because they took down a customer site with a massive number of requests to a single resource, and the incoming request traffic was indistinguishable from a botnet-sourced DDoS attack. The requests were coming from IP space registered to a number of different ISPs, with no distinguishing rDNS (184-106-82-243.static.cloud-ips.com doesn't help me to distinguish between "I'm a professionally-run distributed proxy" and "I'm a pwned box here to hammer your site into the ground").
  3. Pretty much all of the new breed of so-called CDNs aren't actually pro-actively distributing content; they're just proxies. That isn't a bad thing, per se, but I rather dislike the far-too-common practice of installing varnish (and perhaps mod_pagespeed, if they're providing advanced capabilities) on a couple of AWS instances, and hanging out your shingle as a CDN. I prefer a bit of truth in my advertising.
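As a concrete, if hypothetical, illustration of the manual check hinted at above (appending .asc to the artifact URL), verifying one of the downloaded JARs might look like this; note that nothing here establishes trust in the signing key, which is exactly the gap described in the post:
# Fetch the artifact and its detached OpenPGP signature, then verify them.
# Whether the signing key deserves any trust is a separate, unsolved question.
BASE=http://repo1.maven.org/maven2/org/clojure/clojure/1.3.0
wget "$BASE/clojure-1.3.0.jar" "$BASE/clojure-1.3.0.jar.asc"
gpg --verify clojure-1.3.0.jar.asc clojure-1.3.0.jar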

6 July 2014

Dominique Dumont: Status and next step on lcdproc automatic configuration upgrade with Perl and Config::Model

Back in March, I uploaded a Debian version of lcdproc with a unique feature: user and maintainer configurations are merged during package upgrade, so user customizations and developer enhancements are both preserved in the new configuration file. (See this blog for more details.) This avoids tedious editing of the LCDd.conf configuration file after every upgrade of the lcdproc package. At the beginning of June, a new version of lcdproc (0.5.7-1) was uploaded. This triggered another round of automatic upgrades on users' systems. According to the popcon rise of libconfig-model-lcdproc-perl, about 100 people have upgraded lcdproc on their systems. Since automatic upgrade has an opt-out feature, one cannot say for sure that 100 people are actually using automatic upgrade, but I bet a fair portion of them are. So far, only one person has complained: a bug report was filed about the many dependencies brought in by libconfig-model-lcdproc-perl. The next challenge for lcdproc configuration upgrade comes from a bug reported on Ubuntu: the device file provided by the imon kernel module is a moving target. The device file created by the kernel can be /dev/lcd0 or /dev/lcd1 or even /dev/lcd2. Static configuration files and moving targets don't mix well. The obvious solution is to provide a udev rule so that a symbolic link is created from a fixed location (/dev/lcd-imon) to the moving target. Once the udev rule is installed, the user only has to update the LCDd.conf file to use the symlink as the imon device file and we're done. But, wait... The whole point of automatic configuration upgrade is to spare the user this kind of trouble: the upgrade must be completely automatic. Moreover, the upgrade must work in all cases: whether udev is available (Linux) or not. If udev is not available, the value present in the configuration file must be preserved. To know whether udev is available, the upgrade tool (aka cme) will check whether the file provided by udev (/dev/lcd-imon) is present or not. This will be done by the lcdproc postinst script (which is run automatically at the end of the lcdproc upgrade). Which means that the new udev rule must also be
activated in the postinst script before the upgrade is done. In other words, the next version of lcdproc (0.5.7-2) will: In the lcdproc configuration model installed by libconfig-model-lcdproc-perl, the imon device parameter is enhanced so that running cme check lcdproc or cme migrate lcdproc issues a warning if /dev/lcd-imon exists and if imon driver is not configured to use it. This way, the next installation of lcdproc will deliver a fix for imon and cme will fix user s configuration file without requiring user input. The last point is admittedly bad marketing as users will not be aware of the magic performed by Config::Model Oh well In the previous section, I ve briefly mentioned that imon_device parameter is enhanced in lcdproc configuration model. If you re not already bored, let s lift the hood and see what kind of enhancements was added. Let s peek in lcdproc configuration file, LCDd.conf file which is used to generate lcdproc configuration model. You may remember that the formal description of all LCDd.conf parameters and their properties is generated from LCDd.conf to provide lcdproc configuration model. The comments in LCDd.conf follow a convention so that most properties of the parameters can be extracted from the comments. In the example below, the comments show that NewFirmware is a boolean value expressed as yes or no, the latter being the default :
# Set the firmware version (New means >= 2.0) [default: no; legal: yes, no]
NewFirmware=no
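For the curious, the properties extracted from such a comment end up as a leaf element in the generated model. Roughly speaking (this is a hand-written sketch of the idea; the exact element description produced by the model generator may differ), it amounts to something like:

# Sketch only: approximate shape of the model element derived from the
# "[default: no; legal: yes, no]" comment above.
my %new_firmware = (
    type             => 'leaf',
    value_type       => 'boolean',
    write_as         => [ 'no', 'yes' ],  # stored as yes/no in LCDd.conf
    upstream_default => 'no',             # the default documented upstream
);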
Back to the moving target. In LCDd.conf, the imon device file parameter is declared this way:
# Select the output device to use
Device=/dev/lcd0
This means that Device is a string parameter whose default value is /dev/lcd0, which is wrong once the special udev rule provided with the Debian packages is activated. With this rule, the default value must be /dev/lcd-imon. To fix this problem, a special comment is added in the Debian version of LCDd.conf to further tune the properties of the Device parameter:
# select the device to use
#  %
#   default~
#   compute
#     use_eval=1
#     formula="my $l = '/dev/lcd-imon'; -e $l ? $l : '/dev/lcd0';"
#     allow_override=1 -
#   warn_if:not_lcd_imon
#     code="my $l = '/dev/lcd-imon';defined $_ and -e $l and $_ ne $l ;"
#     msg="imon device does not use /dev/lcd-imon link."
#     fix="$_ = undef;"
#   warn_unless:found_device_file
#     code="defined $_ ? -e : 1"
#     msg="missing imon device file"
#     fix="$_ = undef;"
#   - % 
Device=/dev/lcd0
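Taken on its own, the compute formula embedded in this comment boils down to the following (a standalone sketch; during an actual upgrade it is cme, via Config::Model, that evaluates the formula):

#!/usr/bin/perl
# Standalone illustration of the computed default for the imon Device parameter.
use strict;
use warnings;

my $device = do {
    my $l = '/dev/lcd-imon';       # symlink created by the Debian udev rule
    -e $l ? $l : '/dev/lcd0';      # fall back to the static default when udev
                                   # (and thus the symlink) is not available
};

print "default imon device: $device\n";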
This special comment between % and % follows the syntax of Config::Model::Loader. A small configuration model is declared there to enhance the model generated from the LCDd.conf file. Here are the main parts: the static default value is dropped and replaced by a computed value (the formula falls back to /dev/lcd0 when /dev/lcd-imon does not exist, and the user may override the result), the warn_if:not_lcd_imon clause warns when /dev/lcd-imon exists but the configured device is not that link, and the warn_unless:found_device_file clause warns when the configured device file is missing. In both the warn_unless and warn_if parts, the fix code snippet is run by the command cme fix lcdproc and is used to repair the warning condition. In this case, the fix consists in resetting the device configuration value so the computed value above can be used. cme fix lcdproc is triggered during the package postinst script installed by dh_cme_upgrade.

Come to think of it, generating a configuration model from a configuration file can probably be applied to other projects: for instance, php.ini and kdmrc are also shipped with detailed comments. Maybe I should make a more generic model generator from the example used to generate the lcdproc model... Well, I will do it if people show interest. Not in the form "yeah, that would be cool", but in the form "yes, I will use your work to generate a configuration model for project [...]". I'll let you fill in the blank ;-)
Tagged: Config::Model, configuration, debian, lcdproc, Perl, upgrade

15 June 2014

Dominique Dumont: Edit your debian patch header for DEP-3 compliance with cme

While not required by Debian policy, the patch tagging guidelines (aka DEP-3) recommend adding a structured header to source package patches. Long story short, patches should begin with something like:
Description: tweak lcdproc config for debian
 patch LCDd.conf to:
 * use syslog instead of stderr to show message
 * run LCDd as root (to read /dev/lcd* file)
 .
 The latter could be done better by tweaking udev rule
Author: dod
Applied-Upstream: NA
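Under the hood, such a header is simply a block of RFC-822-style fields at the top of the patch. As a rough illustration (a deliberately simplified reader of my own; the real parsing and validation are done by libconfig-model-dpkg-perl), the fields above could be pulled out like this:

#!/usr/bin/perl
# Sketch: read the DEP-3 header fields of a quilt patch, stopping at the diff.
# Continuation lines are kept verbatim; real tools handle the "." marker,
# folding rules and field validation properly.
use strict;
use warnings;

my $patch = shift @ARGV or die "usage: $0 debian/patches/<name>\n";
open my $fh, '<', $patch or die "cannot open $patch: $!";

my (%field, $current);
while (my $line = <$fh>) {
    chomp $line;
    last if $line =~ /^---/ or $line =~ /^\s*$/;     # header ends before the diff
    if ($line =~ /^([A-Za-z-]+):\s*(.*)$/) {         # e.g. "Description: ..." or "Author: ..."
        ($current, $field{$1}) = ($1, $2);
    }
    elsif (defined $current) {
        $field{$current} .= "\n$line";               # continuation line of the previous field
    }
}

printf "%s: %s\n\n", $_, $field{$_} for sort keys %field;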
Making sure that the DEP-3 recommendation is respected may not be fun. Do not despair: once libconfig-model-dpkg-perl is installed on your system, you can check whether your patches respect the DEP-3 recommendation with the following command: $ cme check dpkg-patches
loading data
checking data
check done
You can also check individual patches with: cme check dpkg-patch tweak-conf. Note that bash auto-completion is provided only from version 2.049 of the libconfig-model-dpkg-perl package. If editing the header of your patches with your favorite editor does not strike you as fun, you can also use the cme graphical editor provided by libconfig-model-tkui-perl. Once this package is installed, you can run: cme edit dpkg-patches. You will get a graphical editor to update your patches. This editor shows you the available parameters and the associated documentation to let you provide relevant information. Last but not least, the patch header check is also performed when you check your whole package with cme check dpkg. All the best.
Tagged: Config::Model, debian, package, patch

27 May 2014

Jon Dowland: 2012 In Review

2013 is nearly all finished up and so I thought I'd spend a little time writing up what was notable in the last twelve months. When I did so I found an unfinished draft from the year before. It would be a shame for it to go to waste, so here it is.

2012 was an interesting year in many respects with personal highs and lows. Every year I see lots of "round-up"-style blog posts on the web, titled things like "2012 in music", which attempt to summarize the highlights of the year in that particular context. Here's JWZ's effort, for example. Often they are prefixed with statements like "2012 was a strong year for music" or whatever. For me, 2012 was not a particularly great year. I discovered quite a lot of stuff that I love that was new to me, but not new in any other sense.

In music, there were a bunch of come-back albums that made the headlines. I picked up both Orbital's Wonky and Brian Eno's Lux (debatably a comeback: his first ambient record since 1983, his first solo effort since 2005, but his fourth collaborative effort on Warp in the noughties). I've enjoyed them both, but I've already forgotten Wonky and I still haven't fully embraced Lux (and On Land has not been knocked from the top spot when I want to listen to ambience). There was also Throbbing Gristle's (or X-TG) final effort, a semi/post-TG, partly posthumous double-album swan song which, even more than Lux, I still haven't fully digested. In all honesty I think it was eclipsed by the surprise one-off release of a live recording of a TG side project featuring Nik Void of Factory Floor: Carter Tutti Void's Transverse, which is excellent. Ostensibly a four-track release, there's a studio excerpt V4 studio (Slap 1) which is available from (at least) Amazon. There's also a much more obscure fifth "unreleased" track, cruX, which I managed to "buy" from one of the web shops for zero cost. The other big musical surprise for me last year was Beth Jeans Houghton and the Hooves of Destiny: Yours Truly, Cellophane Nose. I knew nothing of BJH, although it turns out I've heard some of her singles repeatedly on Radio 6, but her band's guitarist Ed Blazey and his partner lived in the flat below me briefly. In that time I managed to get to the pub with him just once, but he kindly gave me a copy of their album on 12" afterwards. It reminds me a bit of Goldfrapp circa "Seventh Tree": I really like it and I'm looking forward to whatever they do next. Reznor's How To Destroy Angels squeezed out An Omen EP which failed to set my world on fire as a coherent collection, despite a few strong songs individually.

In movies, sadly once again I'd say most of the things I recall seeing would be "also-rans". Prometheus was a disappointment, although I will probably rewatch it in 2D at least once. The final Batman was fun although not groundbreaking to me, and it didn't surpass Ledger's efforts in The Dark Knight. Inception remains my favourite Nolan by a long shot. Looper is perhaps the stand-out, not least because it came from nowhere and I managed to avoid any hype.

In games, I moaned about having too many games, most of which are much older than 2012. I started Borderlands 2 after enjoying Borderlands (disqualified on age grounds) but to this day haven't pursued it much further. I mostly played the two similar meta-games: the PlayStation Plus "download free games in a fixed time period" and the more sporadic but bountiful Humble Bundle whack-a-mole. More on these another time.
In reading, as is typical, I mostly read stuff that was not written in 2012. Of that which was, Charles Stross's The Apocalypse Codex was an improvement over The Fuller Memorandum, which I did not enjoy much, but in general I'm finding I much prefer Stross's older work to his newer; David Byrne's How Music Works was my first (and currently last) Google Books ebook purchase, and I read it entirely on a Nexus 7. I thoroughly enjoyed the book but the experience has not made a convert of me away from paper. He leans heavily on his own experiences, which is inevitable, but fortunately they are wide and numerous. Iain Banks' Stonemouth was an enjoyable romp around a fictional Scottish town (one which, I am reliably informed, is incredibly realistically rendered). One of his "mainstream" novels, it avoided a particular plot pattern that I've grown to dread with Banks, much to my surprise (and pleasure). Finally, the stand-out pleasant surprise novel of the year was Pratchett and Baxter's The Long Earth. With a plot device not unlike Banks' Transition or Stross's Family Trade series, the pair managed to write a journey-book capturing the sense of wonder that these multiverse plots are good for. (Or perhaps I have a weakness for them). It's hard to find the lines between Baxter and Pratchett's writing, but the debatably-reincarnated Tibetan Monk-cum-Artificial Intelligence 'Lobsang' must surely be Pratchett's. Pratchett managed to squeeze out another non-Discworld novel (Dodger) as well as a long-overdue short story collection, although I haven't read either of them yet. On to 2013's write-up...

4 May 2014

Dominique Dumont: Deprecating experience level and preset value in Config::Model

Hello. The next releases of Config::Model will deprecate two (mis)features: experience levels and preset values. The behavior of cme check will not change. People using cme edit with the GUI will see many more configuration parameters. All the best.

1 May 2014

Andrew Pollock: [life] Day 93: A whole lot of running around

This morning Zoe woke up again at around 4am and ended up in bed with me. I don't even remember why she woke up, and neither does she, but she's assuring me it won't happen tomorrow morning. We'll see. As a result of the disturbed night's sleep, we had a bit of a slow start to the day. Zoe was happy to go off and watch some TV after she woke up for the day, and that let me have a bit more of a doze before I got up, which made things vaguely better for me.

ABC 4 Kids has been showing a lot of ads for Ha Ha Hairies, which airs at 10:20am, lately, and it's one of the shows that isn't available on iView. Zoe had been lamenting that she never got to see it, and asking me if she could. Today the schedule was fairly open, so I made sure we were home at 10:20am. That involved a quick dash out to Woolies first to get a few bits and pieces for Zoe's birthday party. While Zoe watched the Ha Ha Hairies, I did some bulk egg hard boiling in the oven.

After the Ha Ha Hairies and a little bit of general mucking around, we drove out to Spotlight to pick up the helium tank I'd rented. Zoe nearly fell asleep on the way out there. We picked up the helium tank and headed back home. That errand alone probably took a bit over an hour all up. We got back home and had lunch, but by the time all of that was out of the way, Zoe seemed to have missed the window for her nap. She did have a bit of a rest in bed, flipping through her library books, and I got to read some of my book as well, so that was nice. During that time, I got the call from Bunkers saying they were about half an hour out with the delivery of Zoe's bunk bed (my birthday present for her). That worked out well, as it was towards the start of the two hour window they'd advised me of.

The bunk bed was delivered and then we popped out to Overflow to see if they had any food covers for the party food (they did) so we picked up a few of them. Today I learned that "As Seen On TV" is trademarked, and so "Similar To As Seen On TV" is the trademark-dodging thing to put on cheap knock-offs. As Overflow is two doors down from Petbarn, we stopped in there as well and grabbed some more kitty litter. One does not just pop into Petbarn with Zoe, so we spent some time there looking at the fish and assorted aquarium paraphernalia. They also had some hermit crabs now too.

On the way back home, we stopped in at (a different) Woolies to pick up the half slab of chocolate mudcake that I'd ordered for Zoe's birthday cake. I'd decided that the upright Minion cake I'd initially wanted to do was just way too adventurous for my abilities and not a good use of my time (and the quotes I'd sought for outsourcing it had come in at over $300). I scaled things back to just a flat slab Minion, which may still exceed my cake decorating abilities, but we'll find out tomorrow. There'd been a miscommunication at Woolworths, and my half slab hadn't been boxed up and wasn't ready for my collection. All the bakery staff had already gone home, and I didn't really want to come back tomorrow, because I had this crazy idea of possibly starting on the cake tonight (not going to happen), so a couple of non-bakery staff had to find the cake and figure out how to extract a full slab from the baking tray and cut it in half and box it up. It provided some entertainment for Zoe and me.

We finally got home, and I rehashed some frozen leftovers for dinner. I decided to try something different for the bath time and bed time routine to see if it'd reduce procrastination.
I got Zoe to pick out the three books she wanted to read at bed time before we got to story time, so we'd have something concrete to negotiate with. I also threw in the possibility of a "bonus story" if she didn't muck around. It seemed to work, and we had a fairly streamlined bath time. There's no doubt she was pretty tired today, and she went to bed without any fuss, so I'm hopeful that we'll have a good night tonight.
